
    Walls and Sexuality as Trans-cultural Symbols: A Study of Rudyard Kipling’s Short Story ‘On the City Wall’

    This article aims to discuss Rudyard Kipling’s short story ‘On the City Wall’ (1888) from a trans-cultural perspective by analyzing the tropes of the wall and sexuality. Kipling’s attachment to and love for Indian culture are reflected in his fiction through his detailed descriptions of exotic locations and ethnographic peculiarities. The image of the wall is significant in expressing different roles, as connector, shelter, veil, and boundary, while sexuality serves to unite different mindsets in one place. The article further traces how, by using the tropes of connector, shelter, veil, and boundary, Kipling depicts the inevitability of confrontation between the colonizer and the colonized and a sense of unity among the natives. The analysis concludes with Kipling’s admission that racial, cultural, social, and religious barriers failed to keep the different inhabitants of the unnamed walled city [Lahore] from being united before the partition of the subcontinent.

    Novel one time signatures (NOTS): a compact post-quantum digital signature scheme

    The future of hash-based digital signature schemes appears very bright in the upcoming quantum era because of the quantum threats to number-theoretic digital signature schemes. Shor's algorithm allows a sufficiently powerful quantum computer to break the building blocks of number-theoretic signature schemes in polynomial time. Hash-based signature schemes, being quite efficient and provably secure, can fill this gap effectively. However, a drawback of hash-based signature schemes is their larger key and signature sizes, which can be a barrier to adoption in space-critical applications such as the blockchain. A hash-based signature scheme is constructed from a one-time signature (OTS) scheme, and the underlying OTS scheme plays an important role in determining the key and signature sizes of the resulting hash-based scheme. In this article, we propose a novel OTS scheme with smaller key and signature sizes than all existing OTS schemes. Our proposed OTS scheme offers an 88% reduction in both key and signature sizes compared with the popular Winternitz OTS scheme. Furthermore, it offers 84% and 86% reductions in signature and key sizes, respectively, compared with an existing compact variant of the Winternitz scheme, WOTS+.
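
    As background on how a hash-based OTS trades revealed hash preimages for a signature, the sketch below implements a classic Lamport-style OTS in Python. It is illustrative only and is not the NOTS construction proposed in the article; the helper names (keygen, sign, verify) and the 256-bit SHA-256 digest are assumptions made for the example.

```python
# Minimal Lamport-style one-time signature sketch (illustrative baseline,
# not the NOTS scheme described in the abstract above).
import hashlib
import os

N = 256  # number of message-digest bits covered by the key

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Private key: two random preimages per digest bit; public key: their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(N)]
    pk = [(H(x0), H(x1)) for x0, x1 in sk]
    return sk, pk

def message_bits(message: bytes):
    digest = H(message)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(N)]

def sign(message: bytes, sk):
    # Reveal one preimage per digest bit; the key must never be reused.
    return [sk[i][bit] for i, bit in enumerate(message_bits(message))]

def verify(message: bytes, sig, pk) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(message_bits(message)))

if __name__ == "__main__":
    sk, pk = keygen()
    sig = sign(b"hello", sk)
    print(verify(b"hello", sig, pk))     # True
    print(verify(b"tampered", sig, pk))  # False (with overwhelming probability)
```

    A Winternitz-style OTS shrinks the signature by signing several digest bits per hash chain at the cost of extra hashing, which is the size/computation trade-off that compact OTS designs target.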

    Towards a low complexity scheme for medical images in scalable video coding

    Medical imaging has become of vital importance for diagnosing diseases and conducting noninvasive procedures. Advances in eHealth applications are challenged by the fact that Digital Imaging and Communications in Medicine (DICOM) requires high-resolution images, thereby increasing their size and the associated computational complexity, particularly when these images are communicated over IP and wireless networks. Therefore, medical research requires an efficient coding technique that achieves high quality at low complexity with error-resilient features. In this study, we propose an improved coding scheme that exploits the content features of encoded videos at low complexity, combined with flexible macroblock ordering for error resilience. We identify homogeneous regions in which the search for optimal macroblock modes is terminated early. For non-homogeneous regions, smaller blocks are integrated into a larger partition only if the motion-vector difference is less than a threshold. Results confirm that the proposed technique achieves a considerable performance improvement over existing schemes in terms of reduced computational complexity without compromising the bit rate or peak signal-to-noise ratio. © 2013 IEEE
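
    The snippet below is a minimal sketch of one reading of this mode decision: early-terminate the partition search in homogeneous macroblocks, and in non-homogeneous macroblocks merge the smaller sub-blocks into one partition only when their motion vectors are nearly identical. The variance-based homogeneity test, the block sizes, and both thresholds are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

HOMOGENEITY_VAR_THRESH = 50.0  # assumed luma-variance threshold for "homogeneous"
MV_DIFF_THRESH = 2.0           # assumed motion-vector difference threshold (pixels)

def is_homogeneous(mb: np.ndarray) -> bool:
    """Treat a 16x16 luma macroblock as homogeneous if its variance is low."""
    return float(np.var(mb)) < HOMOGENEITY_VAR_THRESH

def choose_mode(mb: np.ndarray, mv_16x16, mv_8x8_list) -> str:
    """Return a coarse partition decision for one macroblock."""
    if is_homogeneous(mb):
        # Early termination: no need to search the smaller partition modes.
        return "INTER_16x16"
    # Non-homogeneous region: merge sub-blocks only if their motion vectors
    # barely differ from the 16x16 vector; otherwise keep the finer partition.
    mv_diff = max(np.linalg.norm(np.subtract(mv, mv_16x16)) for mv in mv_8x8_list)
    return "INTER_16x16" if mv_diff < MV_DIFF_THRESH else "INTER_8x8"

# Example with a random macroblock and hypothetical motion vectors.
mb = (np.random.rand(16, 16) * 255).astype(np.float32)
print(choose_mode(mb, (1.0, 0.5), [(1.2, 0.4), (0.8, 0.6), (1.1, 0.5), (0.9, 0.3)]))
```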

    Exploring Documentation: A Trivial Dimension of RUP

    The Unified Process (UP) is a commonly used methodology that can be followed by any process model requiring thorough documentation and a well-defined team structure, such as the Rational Unified Process (RUP), which follows the UP methodology. Through documentation, the defect rate of software can be reduced and software quality improved. Quality is the sole objective pursued by stakeholders throughout the whole software development life cycle. Quality is not the outcome of an accident; it is the fruit of the continual labor of devoted professionals. As the size of software increases, it is natural for the number of errors and defects to increase. Cleanroom software engineering is a software development process whose basic objective is to produce high-quality software with an emphasis on raising reliability to the greatest possible degree. Moreover, the Cleanroom process is involved in every phase of the software development life cycle, i.e., planning, measurement, design specification, code verification, testing, and certification, shaping the entire engineering discipline so that the end product should ideally have a zero defect rate. Keywords: Cleanroom software engineering process, documentation, defect rate, Rational Unified Process, quality, reliability

    Transmission Dynamics Model of Coronavirus COVID-19 for the Outbreak in Most Affected Countries of the World

    The wide spread of coronavirus (COVID-19) has threatened millions of lives and damaged the economy worldwide. Due to the severity of and damage caused by the disease, it is very important to forecast the lifetime of the epidemic in order to take timely actions. Unfortunately, the lack of accurate information and the unavailability of large amounts of data at this stage make the task more difficult. In this paper, we use the available data from the countries most affected by COVID-19 (China, Iran, South Korea, and Italy) and fit it with an SEIR-type model in order to estimate the basic reproduction number R_0. We also discuss the development trend of the disease. Our model is quite accurate in predicting the current pattern of the infected population. We also perform a sensitivity analysis on all the parameters that affect the value of R_0.
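
    As a sketch of how an SEIR-type compartmental model is simulated and how a basic reproduction number falls out of it, the Python snippet below integrates a minimal SEIR system. The parameter values, population size, and the relation R_0 = beta/gamma for this simple formulation are illustrative assumptions, not the fitted values or the exact model of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed illustrative parameters: transmission, incubation, and recovery rates.
beta, sigma, gamma = 0.9, 1 / 5.2, 1 / 7.0
N = 1_000_000  # assumed population size

def seir(t, y):
    S, E, I, R = y
    dS = -beta * S * I / N
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    return [dS, dE, dI, dR]

y0 = [N - 1, 0, 1, 0]  # one initial infectious case
sol = solve_ivp(seir, (0, 180), y0, t_eval=np.linspace(0, 180, 181))

print("R_0 =", round(beta / gamma, 2))          # basic reproduction number of this simple SEIR
print("peak infectious:", int(sol.y[2].max()))  # peak of the I compartment
```

    A sensitivity analysis of the kind mentioned above can then be approximated by perturbing beta, sigma, and gamma one at a time and observing the change in R_0 and in the epidemic peak.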

    Exact solutions for unsteady axial Couette flow of a fractional Maxwell fluid due to an accelerated shear

    The velocity field and the adequate shear stress corresponding to the flow of a fractional Maxwell fluid (FMF) between two infinite coaxial cylinders are determined by means of the Laplace and finite Hankel transforms. The motion is produced by the inner cylinder, which at time t = 0+ applies a shear stress f t^a (a ≥ 0) to the fluid. The solutions that have been obtained, presented in series form in terms of the generalized G and R functions, satisfy all imposed initial and boundary conditions. Similar solutions for ordinary Maxwell and Newtonian fluids are obtained as special cases of the general solutions. The unsteady solutions corresponding to a = 1, 2, 3, ... can be written as simple or multiple integrals of the similar solutions for a = 0, and we extend this to any positive real number a by expressing it through fractional integration. Furthermore, for a = 0, 1, and 2, the solutions corresponding to the Maxwell fluid are compared graphically with the solutions obtained earlier in [1–3] by a different technique. For a = 0 and 1, the unsteady motion of a Maxwell fluid, as well as that of a Newtonian fluid, ultimately becomes steady, and the time required to reach the steady state is established graphically. Finally, a comparison between the motions of the FMF and the ordinary Maxwell fluid is underlined by graphical illustrations.
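
    For reference, the constitutive relation commonly used for the fractional Maxwell fluid in this kind of axial flow, together with the stress condition on the inner cylinder, can be written as below. The normalization of the material constants varies between papers, so this is an assumed standard formulation rather than a quotation of the article's own equations.

```latex
% Assumed standard form of the fractional Maxwell shear-stress relation
% (conventions for \lambda and \mu differ between authors).
\left(1 + \lambda^{\alpha} D_t^{\alpha}\right)\tau(r,t)
    = \mu\,\frac{\partial v(r,t)}{\partial r},
\qquad \tau(R_1, t) = f\,t^{a}, \quad t > 0,
```

    where D_t^alpha denotes the Caputo fractional derivative of order 0 < alpha <= 1, lambda the relaxation time, mu the dynamic viscosity, v(r,t) the axial velocity, and R_1 the radius of the inner cylinder; alpha = 1 recovers the ordinary Maxwell fluid and lambda = 0 the Newtonian fluid, matching the special cases mentioned above.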

    Feature extraction and information fusion in face and palmprint multimodal biometrics

    Multimodal biometric systems that integrate biometric traits from several modalities are able to overcome the limitations of single-modal biometrics. Fusing the information at an earlier level, by consolidating the features given by different traits, can give a better result due to the richness of information at this stage. In this thesis, three novel methods are derived and implemented on the face and palmprint modalities, taking advantage of multimodal biometric fusion at the feature level. The benefits of the proposed method are an enhanced capability to discriminate information in the fused features and to capture all of the information required to improve classification performance. The multimodal biometric system proposed here consists of several stages: feature extraction, fusion, recognition, and classification. Feature extraction gathers all important information from the raw images. A new local feature extraction method has been designed to extract information from the face and palmprint images in the form of sub-block windows. Multiresolution analysis using the Gabor transform and the DCT is computed for each sub-block window to produce compact local features for the face and palmprint images. Multiresolution Gabor analysis captures important information in the texture of the images, while the DCT represents the information in different frequency components. Important features with high discrimination power are then preserved by selecting several low-frequency coefficients in order to estimate the model parameters. The extracted local features are fused using a new matrix-interleaved method. The fused feature vector is higher in dimensionality than the original feature vectors from both modalities, so it carries high discriminating power and contains rich statistical information. The fused feature vector also has more data points in the feature space, which is advantageous for the training process using statistical methods. The underlying statistical information in the fused feature vectors is captured using a GMM, whose model parameters are estimated from the distribution of the fused feature vector. The maximum likelihood score is used to measure a degree of certainty for recognition, while maximum likelihood score normalization is used for the classification process. The use of likelihood score normalization is found to suppress an impostor likelihood score when the background model parameters are estimated from a pool of users that includes statistical information about an impostor. The present method achieved its highest recognition accuracies of 97% and 99.7% when tested on the FERET-PolyU and ORL-PolyU datasets, respectively. Universiti Malaysia Perlis and Ministry of Higher Education Malaysia.
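
    The snippet below sketches the block-wise DCT feature extraction and matrix-interleaved fusion described above, in Python. The sub-block size, the number of retained low-frequency coefficients, and the element-wise interleaving layout are assumptions for illustration, and the multiresolution Gabor analysis and GMM classification stages are omitted for brevity.

```python
import numpy as np
from scipy.fft import dctn

BLOCK = 8     # assumed sub-block window size
N_COEFF = 6   # assumed number of low-frequency DCT coefficients kept per block

def block_dct_features(image: np.ndarray) -> np.ndarray:
    """Split an image into BLOCK x BLOCK windows and keep a few low-frequency
    DCT coefficients from each window (row-major order as a stand-in for zig-zag)."""
    h, w = image.shape
    feats = []
    for i in range(0, h - h % BLOCK, BLOCK):
        for j in range(0, w - w % BLOCK, BLOCK):
            coeffs = dctn(image[i:i + BLOCK, j:j + BLOCK], norm="ortho")
            feats.extend(coeffs.ravel()[:N_COEFF])
    return np.asarray(feats)

def interleave_fusion(face_feat: np.ndarray, palm_feat: np.ndarray) -> np.ndarray:
    """Fuse two feature vectors by interleaving their elements, so the fused
    vector alternates face and palmprint coefficients."""
    n = min(face_feat.size, palm_feat.size)
    fused = np.empty(2 * n)
    fused[0::2] = face_feat[:n]
    fused[1::2] = palm_feat[:n]
    return fused

# Example with random stand-ins for a face image and a palmprint image.
face = np.random.rand(64, 64)
palm = np.random.rand(64, 64)
fused = interleave_fusion(block_dct_features(face), block_dct_features(palm))
print(fused.shape)  # fused feature vector, twice the length of each input
```

    In a full pipeline, the fused vectors would then be modeled with a GMM and matched via normalized maximum-likelihood scores, as the thesis describes.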

    Optimal Control Analysis of Ebola Disease with Control Strategies of Quarantine and Vaccination

    The 2014 Ebola epidemic is the largest in history, affecting multiple countries in West Africa. Some isolated cases were also observed in other regions of the world
    • …